19 research outputs found

    An optimal multitier resource allocation of cloud RAN in 5G using machine learning

    Networks have evolved drastically over the last few years to meet user requirements; for example, 5G brings most of the available spectrum under one umbrella. In this work, we address the resource allocation problem in fifth-generation (5G) networks, specifically in Cloud Radio Access Networks (C-RANs). Radio access network mechanisms involve multiple network topologies that are isolated by spectrum band, and they must be enhanced with numerous access technologies in a 5G deployment. C-RAN is one of the best techniques for combining all the available spectral bands; however, existing C-RAN mechanisms lack intelligence in choosing spectral bands. Thus, the C-RAN mechanism requires an advanced tool that identifies the network topology and allocates network resources for substantial traffic volumes. There is therefore a need for a framework that handles spectral resources based on user requirements and network behavior. In this work, we introduce a new C-RAN architecture, recast as a multitier Heterogeneous Cloud Radio Access Network (H-CRAN), in a 5G environment. This architecture handles spectral resources efficiently. Based on the simulation analysis, the proposed multitier H-CRAN architecture, with an improved control unit from the network management perspective, enables augmented granularity, end-to-end optimization, and a quality of service improved by 15 percent over the existing system
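The multitier allocation idea above can be illustrated with a toy sketch: user demands are matched to spectral bands by remaining capacity. The band names, capacities, and greedy scoring rule below are illustrative assumptions, not the paper's actual model.

```python
# Toy sketch of tier-aware spectral-band allocation in a multitier RAN.
# Bands and demands are hypothetical; the paper's learned policy is
# replaced here by a simple greedy heuristic for illustration.

def allocate_bands(requests, bands):
    """Greedily assign each user demand (Mbps) to the band with the
    most spare capacity that can still satisfy it."""
    allocation = {}
    spare = dict(bands)  # band -> remaining capacity (Mbps)
    # Serve the largest demands first so big flows get wide bands.
    for user, demand in sorted(requests.items(), key=lambda kv: -kv[1]):
        feasible = [b for b, cap in spare.items() if cap >= demand]
        if not feasible:
            allocation[user] = None  # blocked: no tier can serve it
            continue
        best = max(feasible, key=lambda b: spare[b])
        spare[best] -= demand
        allocation[user] = best
    return allocation

requests = {"u1": 50, "u2": 120, "u3": 30}
bands = {"sub6": 100, "mmWave": 400}
print(allocate_bands(requests, bands))
```

A learned policy would replace the `max(..., key=...)` scoring rule with a model trained on observed traffic, which is where the machine-learning component of the proposed framework would plug in.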

    A Knowledge-Based Path Optimization Technique for Cognitive Nodes in Smart Grid

    The cognitive network uses cognitive processes to record the data transmission rate among nodes and applies self-learning methods to trace data load points, finding the optimal transmission path in a distributed computing environment. Several industrial systems, e.g., data centers and smart grids, have adopted this cognitive paradigm and retrieved least-HOP-count paths for processing huge datasets with minimum resource consumption. This technique works well when transmitting structured data such as XML; however, if the data is in an unstructured format such as RDF, the transmission technique wraps it in the same payload layout and consequently miscalculates traces of data load points due to the abnormal payload layout. In this paper, we propose a knowledge-based optimal routing path analyzer (RORP) that resolves the payload-wrapping issue by introducing a novel RDF-aware payload layout. The proposed analyzer uses the enhanced payload layout to transmit unstructured RDF triples, with an appended pheromone (footstep) value, through cognitive nodes towards the semantic reservoir. The grid performs analytics and returns the least-HOP-count path for processing huge RDF datasets in the cognitive network. The simulation results show that the proposed approach effectively returns the least-HOP-count path, enhances network performance by minimizing resource consumption at each cognitive node, and reduces traffic congestion through a knowledge-based HOP-count analytics technique in the cognitive environment of the smart grid
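The core routing primitive the abstract relies on, finding a least-HOP-count path between nodes, can be sketched with a plain breadth-first search. The node names are hypothetical, and the pheromone bookkeeping of RORP is omitted; this only shows the HOP-count part.

```python
from collections import deque

# Minimal sketch of least-HOP-count path selection over a node graph.
# BFS explores neighbors level by level, so the first time the
# destination is reached, the discovered path has the fewest hops.

def least_hop_path(graph, src, dst):
    parent = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:  # walk parents back to the source
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in graph.get(node, []):
            if nxt not in parent:  # visit each node once
                parent[nxt] = node
                queue.append(nxt)
    return None  # destination unreachable

grid = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(least_hop_path(grid, "A", "E"))  # → ['A', 'B', 'D', 'E']
```

In the paper's setting, the appended pheromone values would additionally bias tie-breaking between equally short paths; here ties are broken by neighbor order.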

    Ambient backcom in beyond 5G NOMA networks: A multi-cell resource allocation framework

    Non-Orthogonal Multiple Access (NOMA) is studied extensively as a way to improve the capacity of networks beyond the fifth generation. The recent merger of NOMA with ambient Backscatter Communication (BackCom), though opening new possibilities for massive connectivity, poses several challenges in dense wireless networks. One such challenge is the performance degradation of ambient BackCom in multi-cell NOMA networks under inter-cell interference. To provide an efficient solution to this issue, this article proposes a new resource allocation framework based on duality theory. Specifically, the sum rate of the multi-cell network with backscatter tags and NOMA user equipment is maximized by formulating a joint optimization problem. To find the efficient base station transmit power and backscatter reflection coefficient in each cell, the original problem is first divided into two subproblems, and then closed-form solutions are derived. A comparison with Orthogonal Multiple Access (OMA) ambient BackCom and pure NOMA transmission is provided. Simulation results for the proposed NOMA ambient BackCom indicate a significant improvement over OMA ambient BackCom and pure NOMA in terms of sum-rate gains
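The shape of the joint optimization can be sketched numerically: maximize a sum rate over the base station transmit power and the backscatter reflection coefficient. The rate model below (log-SINR with fixed channel gains and a constant inter-cell interference term) is a simplified stand-in for the paper's model, and the grid search stands in for its closed-form dual solution.

```python
import math

# Hedged sketch: single-cell sum rate as a function of BS transmit
# power p and backscatter reflection coefficient beta. Channel gains,
# noise, and interference values are illustrative assumptions.

def sum_rate(p, beta, g_user=1.0, g_tag=0.4, noise=0.1, inter_cell=0.2):
    direct = math.log2(1 + g_user * p / (noise + inter_cell))
    backscatter = math.log2(1 + g_tag * beta * p / (noise + inter_cell))
    return direct + backscatter

# Brute-force search over the feasible box p ∈ (0, 1], beta ∈ [0, 1].
best = max(
    ((p / 10, b / 10) for p in range(1, 11) for b in range(0, 11)),
    key=lambda pb: sum_rate(*pb),
)
print(best)  # → (1.0, 1.0): rate is monotone in both variables here
```

With inter-cell interference coupling the cells, the real problem is no longer monotone per cell, which is what motivates the dual decomposition into two subproblems in the paper.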

    Editorial: Cyber-physical systems: prospects, challenges and role in software-defined networking and blockchains

    In recent years, cyber-physical systems (CPSs) have gained a lot of attention from academia, industry, and government agencies, and are considered the world's third wave of information technology, following computers and the internet [...]

    Multi-Objective Optimum Solutions for IoT-Based Feature Models of Software Product Line

    A software product line is used to develop a family of products by reusing existing resources, lowering cost and time to market. A Feature Model (FM) is used extensively to manage the common and variable features of a family of products, such as Internet of Things (IoT) applications. In the literature, the binary pattern for nested cardinality constraints (BPNCC) approach has been proposed to compute all possible combinations of development features for IoT applications without violating any relationship constraints, where relationship constraints are a predefined set of rules for selecting features from an FM. Due to the high probability of relationship-constraint violations, obtaining optimum feature combinations from large IoT-based FMs is a challenging task. Therefore, to obtain optimum solutions, in this paper we propose a multi-objective optimum-BPNCC that consists of three independent paths (first, second, and third). We applied heuristics to these paths and found that the first path is infeasible due to its space and execution-time complexity. The second path reduces the space complexity; however, its time complexity increases with the growing group of features. Among these paths, the third path performs best, as it removes optional features that are not required for optimization. In experiments, we calculated the outcomes of all three paths, which show a significant improvement of the optimum solution without constraint violations. We theoretically show that the proposed approach improves on previously proposed optimization algorithms, such as the non-dominated sorting genetic algorithm and the indicator-based evolutionary algorithm
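The notion of feature combinations that respect relationship constraints can be made concrete with a small sketch. The feature names and the two rule types (requires/excludes) below are illustrative assumptions, not the paper's BPNCC encoding.

```python
from itertools import combinations

# Illustrative sketch: enumerate feature selections from a tiny IoT
# feature model that satisfy simple cross-tree constraints.

features = ["sensor", "wifi", "zigbee", "cloud_sync"]
requires = {"cloud_sync": "wifi"}   # cloud_sync needs wifi
excludes = {("wifi", "zigbee")}     # the two radios are mutually exclusive

def valid(selection):
    s = set(selection)
    if any(f in s and req not in s for f, req in requires.items()):
        return False  # a required parent feature is missing
    if any(a in s and b in s for a, b in excludes):
        return False  # two mutually exclusive features were selected
    return True

valid_combos = [
    set(c)
    for r in range(1, len(features) + 1)
    for c in combinations(features, r)
    if valid(c)
]
print(len(valid_combos))  # → 7 of the 15 non-empty subsets survive
```

Exhaustive enumeration like this is exactly what becomes infeasible for large FMs, which motivates the pruning paths (e.g., dropping optional features irrelevant to the objectives) described in the abstract.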

    An Aggregate MapReduce Data Block Placement Strategy for Wireless IoT Edge Nodes in Smart Grid

    Big data analytics has simplified the processing of large datasets in distributed environments. Many state-of-the-art platforms, e.g., the smart grid, have adopted the big data processing structure and manage large volumes of data through the MapReduce paradigm at distribution ends. Thus, whenever a wireless IoT edge node bundles a sensor dataset into storage media, a MapReduce agent performs analytics and writes output into the grid repository. This practice efficiently reduces resource consumption in such a giant network and strengthens other components of the smart grid to perform data analytics through aggregate programming. However, it incurs operational latency when accessing large datasets from a central repository. Since the smart grid processes I/O operations of multi-homing networks, it accesses large datasets for processing MapReduce jobs at wireless IoT edge nodes. As a result, aggregate MapReduce at the wireless IoT edge produces network congestion and an operational latency problem. To overcome this issue, we propose the Wireless IoT Edge-enabled Block Replica Strategy (WIEBRS), which stores in-place, partition-based, and multi-homing block replicas at the respective edge nodes. This reduces the delay in accessing datasets for aggregate MapReduce and increases job performance in the smart grid. The simulation results show that WIEBRS effectively decreases operational latency while increasing aggregate MapReduce job performance in the smart grid
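The placement idea, keeping block replicas at the edge nodes closest to where the data is produced so MapReduce tasks read locally instead of from the central grid repository, can be sketched as a ranking problem. The node names, latency figures, and replica count are illustrative assumptions.

```python
# Sketch of latency-aware block replica placement at IoT edge nodes.
# A real strategy like WIEBRS would also account for partitioning and
# multi-homing; this shows only the distance-based selection step.

def place_replicas(source, nodes, latency_ms, replicas=2):
    """Pick the `replicas` edge nodes with the lowest latency to `source`."""
    ranked = sorted(nodes, key=lambda n: latency_ms[(source, n)])
    return ranked[:replicas]

nodes = ["edge1", "edge2", "edge3"]
latency_ms = {
    ("meterA", "edge1"): 5,
    ("meterA", "edge2"): 40,
    ("meterA", "edge3"): 12,
}
print(place_replicas("meterA", nodes, latency_ms))  # → ['edge1', 'edge3']
```

Keeping replicas near the producing meter means an aggregate MapReduce job scheduled on those edge nodes never pays the round trip to the central repository, which is the latency the abstract targets.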

    Mitigating 5G security challenges for next-gen industry using quantum computing

    5G has been launched in a few countries, so the focus has now shifted towards the development of future 6G networks. 5G has connected all aspects of society, and ubiquitous connectivity has opened the doors for more data sharing. Although 5G provides low latency, higher data rates, and high speed, some security-related vulnerabilities remain, and these issues need to be mitigated to secure 6G networks from existing challenges. Classical cryptography will not remain sufficient for securing the 6G network, as classical cryptographic schemes can be broken with the help of quantum mechanics. Therefore, in place of traditional security solutions, this article reviews the existing quantum solutions to 5G's security issues in order to mitigate them and secure 6G in a future quantum world

    An effective revocable and traceable public auditing scheme for sensor-based urban cities

    The various data collected by urban sensor devices need huge storage space. The cloud's powerful computing ability provides decent performance support and reduces the overhead on local storage. Urban sensor devices collect many kinds of data, and sensors of the same kind often need to upload large numbers of files together, which are usually private, so secure cloud storage within group members is essential. However, the cloud may try to modify or hide data for its own benefit, which requires detection by a third-party auditor. This paper puts forward a new public auditing scheme. The way the private key is generated has been changed: the parameters for key generation come from the feature information set of the same sensor. When two information sets are close enough, the correctness of the authenticator generated for the file can be verified. When a sensor is revoked due to damage or loss, the scheme effectively revokes and regenerates the private key with little overhead, thus preventing the leakage of data privacy. This also avoids the huge overhead caused by re-downloading data to regenerate authenticators in traditional schemes. In addition, our scheme is effective in tracking malicious members and reducing the computing overhead of the group. Finally, the experimental results show the effectiveness and practicality of the proposed scheme
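The "close enough" comparison between two sensor feature information sets can be sketched with a set-similarity threshold. The Jaccard measure, the threshold value, and the feature strings below are illustrative assumptions, not the paper's actual fuzzy key construction.

```python
# Hedged sketch: accept a regenerated key only when the observed
# sensor feature set is sufficiently similar to the enrolled one.

def feature_sets_match(set_a, set_b, threshold=0.6):
    """Jaccard similarity test: |A ∩ B| / |A ∪ B| >= threshold."""
    union = set_a | set_b
    if not union:
        return True  # two empty feature sets trivially match
    similarity = len(set_a & set_b) / len(union)
    return similarity >= threshold

enrolled = {"fw_v2", "temp_cal_7", "mac_3f", "region_east", "model_x9"}
observed = {"fw_v2", "temp_cal_7", "mac_3f", "region_east", "model_x8"}
print(feature_sets_match(enrolled, observed))  # → True (4/6 ≈ 0.67)
```

A tolerance like this is what lets a slightly drifted sensor (one changed feature here) still produce a verifiable authenticator, while a replaced or spoofed sensor with mostly different features fails the check.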

    AI-Oriented Smart Power System Transient Stability: The Rationality, Applications, Challenges and Future Opportunities

    Nowadays, the power grid has become an active, colossal resource generation and management system due to the wide use of renewable energy and dynamic workloads processed through intelligent information and communication technologies. Several new operations exist, such as power electrification, intelligent information integration on the physical layer, and complex interconnections in the smart grid. These procedures use data-driven deep learning, big data, and machine learning paradigms to efficiently analyze and control electric power system transient problems and to resolve technical issues with robust accuracy and timeliness. Thus, artificial intelligence (AI) has become vital for addressing and resolving issues related to transient stability assessment (TSA) and generation control. In this paper, we provide a comprehensive review of the role of AI and its sub-procedures in addressing problems in TSA. The article covers an AI-based intelligent power system structure, power system TSA, and the rationality of applying AI to transient situations. Unlike other reviews, this paper discusses the AI-based TSA framework and design process along with intelligent applications and their analytics for power system transient problems. Moreover, we do not limit ourselves to AI: we also cover the direction of big data, which is highly compatible with AI, and discuss future trends, opportunities, challenges, and open issues of AI- and big-data-based transient stability assessment in the smart power grid
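Data-driven TSA, as reviewed above, amounts to classifying a post-fault operating point as stable or unstable from measured features. The sketch below uses a nearest-centroid rule over two illustrative features (rotor angle deviation in degrees, frequency deviation in Hz); the training points are fabricated for illustration and stand in for the deep learning models the survey covers.

```python
import math

# Toy sketch of data-driven transient stability assessment:
# classify an operating point by its distance to the centroids of
# labeled stable/unstable examples. All numbers are illustrative.

stable = [(10.0, 0.1), (12.0, 0.2), (8.0, 0.05)]     # small deviations
unstable = [(80.0, 1.5), (95.0, 2.0), (70.0, 1.2)]   # large deviations

def centroid(points):
    return tuple(sum(coord) / len(points) for coord in zip(*points))

def assess(sample):
    d_stable = math.dist(sample, centroid(stable))
    d_unstable = math.dist(sample, centroid(unstable))
    return "stable" if d_stable < d_unstable else "unstable"

print(assess((11.0, 0.15)))  # near the stable cluster
print(assess((85.0, 1.8)))   # near the unstable cluster
```

Production TSA models replace this centroid rule with deep networks trained on simulated contingencies, but the input/output contract (measured transient features in, a stability label out) is the same.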